Cross Validation in Compressed Sensing via the Johnson-Lindenstrauss Lemma

Author

  • Rachel Ward
Abstract

Compressed Sensing decoding algorithms aim to reconstruct an unknown N-dimensional vector x from m < N given measurements y = Φx, under an assumed sparsity constraint on x. Such algorithms are presently iterative in nature, producing a sequence of approximations (s_1, s_2, ...) until an algorithm-specific stopping criterion is reached at iteration j*, at which point the estimate x̂ = s_{j*} is returned as an approximation to x. In many algorithms, the error ||x − x̂||_{ℓ₂^N} of the approximation is bounded above by a function of the error between x and the best k-term approximation to x. However, as x is unknown, such estimates provide no numerical bounds on the error. In this paper, we demonstrate that tight numerical upper and lower bounds on the error ||x − s_j||_{ℓ₂^N} for j ≤ p iterations of a compressed sensing decoding algorithm are attainable with little effort. More precisely, we assume a maximum number of iterations p is imposed in advance; we reserve 4 log p of the original m measurements and compute the s_j from the remaining m − 4 log p measurements; the errors ||x − s_j||_{ℓ₂^N}, for j = 1, ..., p, can then be bounded with high probability. As a consequence, a numerical upper bound on the error between x and the best k-term approximation to x can be estimated at almost no cost. Our observation has applications outside of Compressed Sensing as well.
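The cross-validation scheme described in the abstract can be sketched numerically: hold out a few rows of a Gaussian measurement matrix as a Johnson-Lindenstrauss map Ψ, decode from the remaining rows, and use ||y_cv − Ψ s_j|| as a computable surrogate for the unknown error ||x − s_j||. This is a minimal illustration, not the paper's construction: the dimensions, the 40 held-out rows (the paper reserves 4 log p), and the use of iterative hard thresholding as a stand-in decoder are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

N, m, k, r = 400, 240, 10, 40   # ambient dim, total measurements, sparsity, held-out rows
p = 15                          # pre-imposed maximum number of iterations

# k-sparse ground-truth signal (unknown to the decoder)
x = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Gaussian measurement matrix, split into decoding rows and reserved JL rows
Phi = rng.standard_normal((m, N))
A = Phi[: m - r] / np.sqrt(m - r)    # rows used by the decoder
Psi = Phi[m - r :] / np.sqrt(r)      # held-out rows: E ||Psi e||^2 = ||e||^2
y, y_cv = A @ x, Psi @ x

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

# Iterative hard thresholding as a stand-in iterative decoder
s = np.zeros(N)
true_errs, est_errs = [], []
for _ in range(p):
    s = hard_threshold(s + A.T @ (y - A @ s), k)
    true_errs.append(np.linalg.norm(x - s))           # needs the unknown x
    est_errs.append(np.linalg.norm(y_cv - Psi @ s))   # computable from data alone

print(f"final true error {true_errs[-1]:.3e}, holdout estimate {est_errs[-1]:.3e}")
```

The point of the sketch is that the holdout estimate tracks the true error at every iteration without ever touching x, which is exactly what makes a numerical stopping rule possible.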


Related Articles

The Johnson-Lindenstrauss Lemma Meets Compressed Sensing

We show how two fundamental results in analysis related to n-widths and Compressed Sensing are intimately related to the Johnson-Lindenstrauss lemma. Our elementary approach is based on the same concentration inequalities for random inner products that have recently provided simple proofs of the Johnson-Lindenstrauss lemma. We show how these ideas lead to simple proofs of Kashin’s theorems on w...


A Simple Proof of the Restricted Isometry Property for Random Matrices

We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for random matrices that underlies Compressed Sensing. Our approach has two main ingredients: (i) concentration inequalities for random inner products that have recently provided algorithmically simple proofs of the Johnson–Lindenstrauss lemma; and (ii) covering numbers for finite-dimensi...


A strong restricted isometry property, with an application to phaseless compressed sensing

The many variants of the restricted isometry property (RIP) have proven to be crucial theoretical tools in the fields of compressed sensing and matrix completion. The study of extending compressed sensing to accommodate phaseless measurements naturally motivates a strong notion of restricted isometry property (SRIP), which we develop in this paper. We show that if A ∈ Rm×n satisfies SRIP and ph...


236779: Foundations of Algorithms for Massive Datasets, Lecture 4: The Johnson-Lindenstrauss Lemma

The Johnson-Lindenstrauss lemma and its proof. This lecture aims to prove the Johnson-Lindenstrauss lemma. Since the lemma is proved easily with another interesting lemma, part of this lecture is focused on the proof of this second lemma. At the end, the optimality of the Johnson-Lindenstrauss lemma is discussed. Lemma 1 (Johnson-Lindenstrauss). Given the initial space X ⊆ R^n s.t. |X| = N, ...
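The content of the lemma can be illustrated with a Gaussian random projection: a point set of size N in high dimension embeds into roughly O(ε⁻² log N) dimensions with all pairwise distances distorted by at most a (1 ± ε) factor. A minimal sketch, where the constant 8/ε² for the target dimension is one common choice rather than the lecture's exact bound:

```python
import numpy as np

rng = np.random.default_rng(1)

n, N, eps = 1000, 50, 0.3                    # ambient dim, number of points, distortion
m = int(np.ceil(8 * np.log(N) / eps**2))     # target dim, O(eps^-2 log N)

X = rng.standard_normal((N, n))              # the point set
P = rng.standard_normal((m, n)) / np.sqrt(m) # Gaussian JL map, E ||P v||^2 = ||v||^2
Y = X @ P.T                                  # projected points

# measure the worst relative distortion over all pairwise distances
worst = 0.0
for i in range(N):
    for j in range(i + 1, N):
        d0 = np.linalg.norm(X[i] - X[j])
        d1 = np.linalg.norm(Y[i] - Y[j])
        worst = max(worst, abs(d1 / d0 - 1))

print(f"target dim m = {m}, worst pairwise distortion = {worst:.3f}")
```

Note that m depends only on N and ε, not on the ambient dimension n, which is the force of the lemma.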


Robustness Properties of Dimensionality Reduction with Gaussian Random Matrices

In this paper we study the robustness properties of dimensionality reduction with Gaussian random matrices having arbitrarily erased rows. We first study the robustness property against erasure for the almost norm preservation property of Gaussian random matrices by obtaining the optimal estimate of the erasure ratio for a small given norm distortion rate. As a consequence, we establish the rob...



Journal:

Volume   Issue

Pages  -

Publication date 2008